Costruzione dell'Atlantic Multi-decadal Oscillation (AMO) Index e confronto con le temperature medie annuali della Groenlandia.¶

Claudio Fadda (mat. 813499) - Progetto del corso in Big Data in Geographic Information Systems¶

La AMO è una oscillazione delle temperature superficiali nel Nord Atlantico, in particolare indica le oscillazioni ogni 60-80 anni che occorrono nel tratto di oceano compreso tra l'Equatore e la Groenlandia.

La regione della Groenlandia, soprattutto nell'ultimo decennio, è nota per essere molto influenzata dai cambiamenti climatici, l'aumento delle temperature e in particolare lo scioglimento dei ghiacciai può avere conseguenze devastanti per l'intero globo.

In questo notebook si vuole andare a studiare, se esiste una relazione tra temperature della Groenlandia e l'AMO index.

In [69]:
# importazione delle librerie

import os

import numpy as np
import pandas as pd
import geopandas as gpd

import datetime as dtm
from datetime import datetime

import matplotlib.pyplot as plt
import cartopy.crs as ccrs

import scipy as sp
from scipy import stats
import seaborn as sns

import xarray as xr

from PIL import Image
In [70]:
flist = [os.path.join(path, name) for path, subdirs, files in os.walk("./CRUTEM.4.6.0.0.station_files/") for name in files]

flist=flist[1:]
print ('\n',flist[0:5])

nst=len(flist)
 ['./CRUTEM.4.6.0.0.station_files/CRUTEM.4.6.0.0.station_files\\01\\010010', './CRUTEM.4.6.0.0.station_files/CRUTEM.4.6.0.0.station_files\\01\\010030', './CRUTEM.4.6.0.0.station_files/CRUTEM.4.6.0.0.station_files\\01\\010050', './CRUTEM.4.6.0.0.station_files/CRUTEM.4.6.0.0.station_files\\01\\010070', './CRUTEM.4.6.0.0.station_files/CRUTEM.4.6.0.0.station_files\\01\\010080']

Prima parte: creazione di una serie temporale relativa alle anomalie di temperatura in Groenlandia.¶

In questa prima parte si andrà a ricalcare la procedura eseguita da CRUTEM per andare a creare la serie storica relativa alle anomalie di temperature nella zona della Groenlandia.

In [71]:
metadati = pd.DataFrame(columns=['ID','nome','paese','altezza','lat','lon'])
In [72]:
# creazione di un asse temporale che va da gennaio del 1850 al dicembre 2014
asset = pd.date_range('1850-01', '2015-01', freq='M')
nmonths=len(asset)
nyears=nmonths/12

#dati dalle stazioni.
dati = pd.DataFrame({'time': asset})
dati = dati.set_index(['time'])
In [73]:
# Come periodo di riferimento prendiamo il periodo tra il 1961 e il 1990
yref0 = 1961
yref1 = 1990
In [74]:
# caricamento delle osservazioni dalle stazioni

nst_tmp = nst

nodatacount=0
tooshortcount=0
outsidecount=0

from tqdm import tqdm

for si in tqdm(range(0,nst_tmp)):
    
    filein=flist[si]

    with open(filein) as f: data = f.readlines()
    skipi = data.index("Obs:\n")+1
    if (len(data)-skipi < 2):
        nodatacount+=1
        continue
        
    yr = np.genfromtxt(filein, skip_header=skipi, delimiter=None, usecols=0, dtype='i4')
    nyr = len(yr)
    
    # primo filtro (se < di 30 anni scarto il file)
    if (nyr < 30):
        tooshortcount+=1
        continue
        
    # secondo filtro (country), prendiamo solo le stazioni relative alla Groenlandia
    stcountry_line = [line for line in data if "Country=" in line]
    stcountry = str(stcountry_line).split("=",1)[1]
    stcountry = stcountry[:-4].strip()
    
    if (stcountry != 'GREENLAND'):
        continue
        
    #terzo filtro (se presenti valori null)
    norm_line = [line for line in data if "Normals=" in line]
    norm = str(norm_line).split("=",1)[1]
    norm = norm[:-4].strip()

    if (norm.find("-99.0") > -1): 
        outsidecount+=1
        continue
        
    stname_line = [line for line in data if "Name=" in line]
    stname = str(stname_line).split("=",1)[1]
    stname = stname[:-4].strip()
    
    country_line = [line for line in data if "Country=" in line]
    country = str(country_line).split("=",1)[1]
    country = country[:-4].strip()
    
    elev_line = [line for line in data if "Height=" in line]
    elev = str(elev_line).split("=",1)[1]
    elev = float(elev[:-4].strip())
        
    lat_line = [line for line in data if "Lat=" in line]
    lat = str(lat_line).split("=",1)[1]
    lat = float(lat[:-4].strip())
    
    lon_line = [line for line in data if "Long=" in line]
    lon = str(lon_line).split("=",1)[1]
    lon = float(lon[:-4].strip())

    
    metadati = metadati.append({
        "ID": filein[-6:],
        "nome": stname,
        "paese": country,
        "altezza": elev,
        "lat": lat,
        "lon": lon,
    }, ignore_index=True)

    ti0 = datetime.strptime(str(yr[0]), "%Y")
    ti1 = datetime.strptime(str(yr[nyr-1]+1), "%Y")
    

    stime = pd.date_range(ti0, ti1, freq='M')
    
        
    loc_data_2d = np.genfromtxt(filein, skip_header=skipi, filling_values='-99.0', delimiter=None, dtype='float')[:,1:13]
    loc_data_1d = loc_data_2d.flatten()
    
    
    xdf = pd.DataFrame({'time': stime, filein[-6:]: np.asarray(loc_data_1d)})
    xdf = xdf.set_index('time')
    
    dati = pd.merge(dati, xdf, on='time', how='left')
 10%|█         | 1037/10295 [00:11<01:43, 89.83it/s]C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
 10%|█         | 1047/10295 [00:11<02:12, 69.62it/s]C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
 10%|█         | 1055/10295 [00:11<02:46, 55.66it/s]C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
 10%|█         | 1062/10295 [00:11<03:12, 47.95it/s]C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
 10%|█         | 1068/10295 [00:12<03:44, 41.15it/s]C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
 10%|█         | 1075/10295 [00:12<03:19, 46.19it/s]C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
 11%|█         | 1081/10295 [00:12<03:45, 40.84it/s]C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
 11%|█         | 1086/10295 [00:12<03:41, 41.62it/s]C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
 16%|█▌        | 1657/10295 [00:18<01:28, 97.82it/s] C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\1659791162.py:67: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.
  metadati = metadati.append({
100%|██████████| 10295/10295 [00:50<00:00, 204.78it/s]
In [75]:
# stazioni prese in considerazione
metadati
Out[75]:
ID nome paese altezza lat lon
0 042020 PITUFFIK/THULE USAF GREENLAND 70.0 76.5 68.8
1 042100 UPERNAVIK GREENLAND 63.0 72.8 56.2
2 042110 UPERNAVIK GREENLAND 63.0 72.8 56.2
3 042180 QEQERTARSUAQ GREENLAND 24.0 69.2 53.5
4 042200 EGEDESMINDE GREENLAND 47.0 68.7 52.8
5 042210 ILULISSAT/JAKOBSHAVN GREENLAND 31.0 69.2 51.1
6 042300 SISIMIUT GREENLAND 12.0 66.9 53.7
7 042310 KANGERLUSSUAQ (SDR. GREENLAND 53.0 67.0 50.7
8 042500 GODTHAB/NUUK GREENLAND 50.0 64.2 51.8
9 042600 PAAMIUT (FREDERIKSH GREENLAND 15.0 62.0 49.7
10 042700 MITTARFIK/NARSARSUAQ GREENLAND 31.0 61.2 45.4
11 042720 JULIANEHAB/QAQORTOQ GREENLAND 32.0 60.7 46.1
12 043100 NORD GREENLAND 35.0 81.5 16.8
13 043200 DANMARKSHAVN GREENLAND 11.0 76.8 18.8
14 043300 DANEBORG GREENLAND 44.0 74.3 20.2
15 043390 SCORESBYSUND GREENLAND 65.0 70.5 22.0
16 043400 KAP TOBIN GREENLAND 41.0 70.5 22.0
17 043500 APUTITEEQ GREENLAND 20.0 67.8 32.3
18 043600 AMMASALIK GREENLAND 50.0 65.6 37.6
19 043900 PRINS CHRISTIANS S. GREENLAND 76.0 60.0 43.1
20 099999 SW COMBINED SERIES GREENLAND -999.0 -99.9 -999.9

Prendendo in considerazione solamente la Groenlandia sono state trovate un totale di 20 stazioni, notiamo che l'ultima stazione con ID 099999 non è ben definita e quindi la eliminiamo.

In [76]:
metadati = metadati.drop(labels=[20], axis=0)
dati = dati.drop(labels=['099999'], axis=1)

Conversione dei missing value in NaN

In [77]:
dati[dati == -99.0] = np.nan
metadati = metadati.set_index(["ID"])
In [78]:
# correzione della longitudine
metadati.lon = -metadati.lon

Verifichiamo che tutte le stazioni prese in considerazione siano posizionate nella Groenlandia.

In [79]:
ax = plt.axes(projection=ccrs.PlateCarree())
ax.coastlines()
ax.stock_img()
ax.scatter([metadati.lon],[metadati.lat],color='r',marker='o',s=1.0)
plt.show()

Verifichiamo stazione per stazione le temperature assolute.

In [80]:
fig = plt.figure(figsize=(13,19))

no = 20

subplots = (no,2)
n_panels = subplots[0] * subplots[1]

proj = ccrs.PlateCarree()

ssite = metadati.index

for fi, f in enumerate(ssite):

    ax = fig.add_subplot(subplots[0], subplots[1], (fi*2)+1, projection=proj)
    ax.set_title(' - '.join([metadati.nome[ssite[fi]],metadati.paese[ssite[fi]]]))
    ax.stock_img()
    ax.coastlines()
    plt.plot(metadati.lon[ssite[fi]], metadati.lat[ssite[fi]],
        color='red', marker='o', markersize=5,
        transform=ccrs.Geodetic())

    tser = fig.add_subplot(subplots[0], subplots[1], (fi+1)*2)

    plt.plot(dati[ssite[fi]])
    
fig.tight_layout()
plt.show()
    

Dal plot vediamo tutte le stazioni che hanno registrato dati relativi alle temperature in Groenlandia. Ciascuna stazione ha cominciato a registrare i dati in anni diversi, questo significa che non si può avere una copertura totale delle temperature, inoltre quasi tutte le stazioni presentano dei missing value, consideriamo che stiamo studiando un fenomeno fisico si è deciso di non andare a sostituire i missing value, questo perché la temperatura varia per via di diversi fattori in gioco e quindi sarebbe necessario uno studio più approfondito in tal senso.

Modifiche effettuate:

  • Upernavik: compare due volte quindi prendiamo quella con più dati ed eliminiamo l'altra ('042100');
  • Qeqertarsuaq: ha meno del 50% di osservazioni, si è deciso quindi di eliminare questa stazione.
In [81]:
metadati.reset_index(inplace = True)
In [82]:
metadati = metadati.drop(labels=[1], axis=0)
dati = dati.drop(labels=['042100'], axis=1)
In [83]:
metadati = metadati.drop(labels=[3], axis=0)
dati = dati.drop(labels=['042180'], axis=1)
In [84]:
metadati = metadati.set_index(["ID"])
In [85]:
x1=dati.index.get_loc('1961-01-31')
x2=dati.index.get_loc('1990-12-31')

# calcolo delle normals, si è fatta la media di tutti i gennaio dal 1960 al 1990 per la stazione 042020 e così via
# per tutti i mesi e per tutte le stazioni
data_normals = dati[x1:x2].groupby(dati[x1:x2].index.month).mean()
print(data_normals)
         042020     042110     042200     042210     042300     042310  \
time                                                                     
1    -23.580000 -17.956667 -13.363333 -13.373333 -12.733333 -18.889286   
2    -25.193333 -20.403333 -15.576667 -14.876667 -13.850000 -19.000000   
3    -23.170000 -20.373333 -16.246667 -15.150000 -13.990000 -17.660714   
4    -16.103333 -13.263333  -9.613333  -8.093333  -7.120000  -7.757143   
5     -4.496667  -3.900000  -1.820000   0.263333  -0.256667   2.675000   
6      2.226667   1.743333   2.683333   4.563333   3.490000   8.539286   
7      5.146667   5.313333   5.723333   7.570000   6.213333  10.789286   
8      4.226667   5.030000   5.320000   6.426667   6.044828   8.625000   
9     -2.180000   0.866667   2.353333   2.613333   3.250000   3.278571   
10   -10.586667  -4.246667  -2.216667  -3.736667  -1.780000  -5.539286   
11   -17.426667  -8.866667  -5.903333  -7.973333  -5.826667 -11.853571   
12   -21.741379 -14.055172  -9.696552 -11.172414  -9.875862 -16.262963   

        042500    042600     042700    042720     043100     043200  \
time                                                                  
1    -7.430000 -6.586667  -6.776667 -5.493333 -30.292000 -23.113333   
2    -7.786667 -6.406667  -6.123333 -5.036667 -30.512500 -24.270000   
3    -7.973333 -5.993333  -5.113333 -4.406667 -29.928000 -23.353333   
4    -3.836667 -2.330000  -0.133333 -0.586667 -23.157692 -17.160000   
5     0.643333  1.426667   5.151724  3.323333  -9.996000  -6.500000   
6     3.920000  3.713333   8.337931  5.193333  -0.630769   0.796667   
7     6.536667  5.553333  10.263333  7.173333   3.300000   3.717241   
8     6.093333  5.310000   9.280000  7.220000   2.124000   2.303571   
9     3.470000  3.573333   5.486667  4.983333  -8.368000  -4.327586   
10   -0.683333  0.120000   0.363333  1.150000 -19.515385 -13.741379   
11   -3.650000 -2.823333  -3.233333 -1.916667 -25.746154 -19.943333   
12   -6.110345 -5.306897  -6.041379 -4.358621 -27.815385 -21.948276   

         043300     043390     043400     043500    043600    043900  
time                                                                  
1    -20.150000 -16.080000 -16.403846 -10.513043 -7.493333 -4.057692  
2    -21.544000 -17.113333 -17.707692 -11.229167 -7.693333 -3.922222  
3    -20.738462 -16.526667 -17.250000 -11.368000 -8.146667 -3.789655  
4    -14.688889 -11.190000 -11.726923  -7.083333 -3.976667 -0.932143  
5     -5.069231  -3.466667  -3.769231  -1.737500  0.673333  1.803448  
6      1.160000   1.123333   0.576000   1.008333  4.180000  4.373333  
7      4.004167   3.348276   2.734615   2.208696  6.393333  6.526667  
8      3.462500   3.514286   3.264000   2.356522  6.003333  6.500000  
9     -2.226087  -0.372414  -0.400000   0.447826  2.993333  4.425000  
10   -10.931818  -6.379310  -6.380769  -2.579167 -0.870000  1.339286  
11   -17.118182 -12.213333 -12.192308  -7.200000 -4.753333 -1.260714  
12   -18.895652 -14.844828 -15.128000  -9.728000 -7.286207 -3.134615  

Ora dobbiamo calcolare le anomalie, e quindi sottrarre mese per mese a tutti i dati della serie temporale il corrispettivo mese delle climate normals stazione per stazione.

In [86]:
data_anom = dati

y0=0
for yi in range(0,int(nyears)):
    data_anom.iloc[y0:y0+12,:] = dati.iloc[y0:y0+12,:]-np.array(data_normals.iloc[:,:])
    y0+=12
In [87]:
fig = plt.figure(figsize=(13,19))

no = 20

subplots = (no,2)
n_panels = subplots[0] * subplots[1]

proj = ccrs.PlateCarree()

ssite = metadati.index

for fi, f in enumerate(ssite):
    
    ax = fig.add_subplot(subplots[0], subplots[1], (fi*2)+1, projection=proj)
    ax.set_title(' - '.join([metadati.nome[ssite[fi]],metadati.paese[ssite[fi]]]))
    ax.stock_img()
    ax.coastlines()
    plt.plot(metadati.lon[ssite[fi]], metadati.lat[ssite[fi]],
        color='red', marker='o', markersize=5,
        transform=ccrs.Geodetic())
    
    tser = fig.add_subplot(subplots[0], subplots[1], (fi+1)*2)
    plt.plot(data_anom[ssite[fi]])
    
fig.tight_layout()
plt.show()

Dai grafici in alto possiamo notare le anomalie di temperatura per ciascuna stazione.

In [88]:
# si crea una griglia utilizzando una risoluzione di 10 gradi
resol = 10
nlon = np.int(360/resol)
nlat = np.int(180/resol)


grlons = np.empty([nlon+1],dtype='float')
grlats = np.empty([nlat+1],dtype='float')
grlons[0] = -180.
grlats[0] = -90.

for i in range(1,nlon+1):
    grlons[i]=grlons[i-1]+resol
for i in range(1,nlat+1):
    grlats[i]=grlats[i-1]+resol
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\4271657429.py:3: DeprecationWarning: `np.int` is a deprecated alias for the builtin `int`. To silence this warning, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  nlon = np.int(360/resol)
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\4271657429.py:4: DeprecationWarning: `np.int` is a deprecated alias for the builtin `int`. To silence this warning, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  nlat = np.int(180/resol)
In [89]:
data_mo = np.empty([nmonths,nlat,nlon],dtype=float)
data_mo[:,:,:] = np.nan

data_yr = np.empty([int(nyears),nlat,nlon],dtype=float)
data_yr[:,:,:] = np.nan
In [90]:
# si va ad analizzare una alla volta in sequenza tutte le celle della griglia creata
for j in range(0,nlat):
    for i in range(0,nlon):
        # si verificano le coordinate delle singole stazioni andando a considerare solo quelle che sono all'interno della cella
        dummy = metadati[metadati.lon >= grlons[i]]
        dummy = dummy[dummy.lon < grlons[i+1]]
        dummy = dummy[dummy.lat >= grlats[j]]
        dummy = dummy[dummy.lat < grlats[j+1]]

        # nel caso dentro la cella sia presente almeno una stazione si eseguono i passaggi successivi

        if (len(dummy.index) > 0):
            
            #calcolo della media delle serie temporali delle stazioni che sono all'interno della cella
            
            data_mo[:,j,i] = np.array(data_anom[dummy.index].mean(axis=1)).flatten()
            
            data_yr[:,j,i] = np.array(data_anom[dummy.index].mean(axis=1).resample("Y").mean()).flatten()
            

            fig = plt.figure(figsize=(13,3))
            subplots = (1,2)
            n_panels = subplots[0] * subplots[1]
            
            ax = fig.add_subplot(subplots[0], subplots[1], 1, projection=ccrs.PlateCarree())
            ax.stock_img()
            ax.coastlines()
            ax.gridlines(xlocs=range(-180,181,resol), ylocs=range(-90,91,resol))
            ax.scatter([dummy.lon],[dummy.lat],color='r',marker='o',s=1.2)
            
            tser = fig.add_subplot(subplots[0], subplots[1], 2)
            plt.plot(asset,np.array(data_anom[dummy.index]),color='grey',linewidth=0.1)
            plt.plot(asset,data_mo[:,j,i],color='blue',linewidth=0.3)
            plt.plot(pd.date_range('1850-01', '2015-01', freq='Y'),data_yr[:,j,i],color='red',linewidth=2)

            fig.tight_layout()
            plt.show()            

Abbiamo ora le anomalie di temperature raggruppate per celle. Per ultimo si devono aggregare tutte le celle nello spazio, in modo da ottenere un'unica serie di anomalie che sia rappresentativa per l'intera Groenlandia.

In [91]:
fig = plt.figure(figsize=(13,13))
subplots = (2,1)

ax = fig.add_subplot(subplots[0], subplots[1], 1, projection=ccrs.PlateCarree())
ax.stock_img()
ax.coastlines()
ax.gridlines(xlocs=range(-180,181,resol), ylocs=range(-90,91,resol))

tser = fig.add_subplot(subplots[0], subplots[1], 2)

data_glob = np.empty([int(nyears)],dtype=float)
data_glob[:] = np.nan

for j in range(0,nlat):
    for i in range(0,nlon):
            
        if not(np.isnan(data_yr[:,j,i]).all()):
         
            ax.plot( grlons[i:i+2].sum()/2, grlats[j:j+2].sum()/2,
                color='red', marker='x', markersize=5,transform=ccrs.PlateCarree())
            plt.plot(pd.date_range('1850-01', '2015-01', freq='Y'),data_yr[:,j,i],color='gray',linewidth=0.1)
    
for t in range(0,int(nyears)):
    local = data_yr[t,:,:]
    valid = np.isnan(data_yr[t,:,:])
    data_glob[t] = np.nanmean(local[~valid])
    
plt.plot(pd.date_range('1850-01', '2015-01', freq='Y'),data_glob[:],color='red',linewidth=2)

plt.axhline(y=0, color='black', linestyle='-')
plt.title('Anomalie di temperatura in Groenlandia')
plt.ylabel('[°C]')

fig.tight_layout()

Anomalie di temperatura della Groenlandia.

In [92]:
df1 = pd.DataFrame(pd.date_range('1850', '2015', freq='Y'), columns = ['anno'])
df1['anno'] = pd.DatetimeIndex(df1['anno']).year
df1['anomGroe'] = data_glob

Seconda parte: creazione dell'AMO Index.¶

Per la creazione dell'AMO Index utilizziamo un modello relativo alle temperature superficiali in gradi C° dei mari di tutto il globo. Il periodo del modello va dal gennaio 1850 sino al dicembre 2014.

L'indice AMO viene calcolato prendendo la media globale annuale delle anomalie di temperature superficiale dei mari escludendo il Nord dell'Atlantico (0°-60°N, 75°W-7.5°W) sottraendola dalla media delle anomalie annuali calcolate esclusivamente sul Nord Atlantico.

In [93]:
modfile='./tos_Omon_MCM-UA-1-0_historical_r1i1p1f2_gn_185001-201412.nc'

Si comincia costruendo un modello che tiene conto dell'intero pianeta tranne il Nord dell'Altantico e poi un modello che tiene conto esclusivamente di quest'ultimo.

In [94]:
d1 = d2 = d3 = d4 = xr.open_dataset(modfile)
d1
Out[94]:
<xarray.Dataset>
Dimensions:    (bnds: 2, longitude: 192, latitude: 80, time: 1980)
Coordinates:
  * bnds       (bnds) float64 1.0 2.0
  * longitude  (longitude) float64 -0.9375 0.9375 2.812 ... 353.4 355.3 357.2
  * latitude   (latitude) float64 -88.63 -86.13 -83.88 ... 83.88 86.13 88.63
  * time       (time) object 1850-01-17 00:00:00 ... 2014-12-17 00:00:00
Data variables:
    lon_bnds   (longitude, bnds) float64 -1.875 0.0 0.0 ... 356.2 356.2 358.1
    lat_bnds   (latitude, bnds) float64 -90.0 -87.26 -87.26 ... 87.26 87.26 90.0
    tos        (time, latitude, longitude) float32 ...
    time_bnds  (time, bnds) object 1850-01-01 12:00:00 ... 2015-01-01 12:00:00
Attributes: (12/45)
    source:                 Manabe Climate Model at U of Arizona             ...
    activity_id:            CMIP
    branch_method:          standard
    Conventions:            CF-1.7 CMIP-6.2 
    creation_date:          2019-08-14T00:00:00Z
    data_specs_version:     01.00.28
    ...                     ...
    sub_experiment:         none
    sub_experiment_id:      none
    title:                  UArizona MCM-UA-1-0 historical          
    product:                model-output
    institution:            Department of Geosciences, University of Arizona,...
    license:                CMIP6 model data produced by the U of Arizona is ...
xarray.Dataset
    • bnds: 2
    • longitude: 192
    • latitude: 80
    • time: 1980
    • bnds
      (bnds)
      float64
      1.0 2.0
      long_name :
      vertex number
      array([1., 2.])
    • longitude
      (longitude)
      float64
      -0.9375 0.9375 ... 355.3 357.2
      long_name :
      longitude
      units :
      degrees_east
      cell_methods :
      time: point
      axis :
      X
      standard_name :
      longitude
      bounds :
      lon_bnds
      array([ -0.9375,   0.9375,   2.8125,   4.6875,   6.5625,   8.4375,  10.3125,
              12.1875,  14.0625,  15.9375,  17.8125,  19.6875,  21.5625,  23.4375,
              25.3125,  27.1875,  29.0625,  30.9375,  32.8125,  34.6875,  36.5625,
              38.4375,  40.3125,  42.1875,  44.0625,  45.9375,  47.8125,  49.6875,
              51.5625,  53.4375,  55.3125,  57.1875,  59.0625,  60.9375,  62.8125,
              64.6875,  66.5625,  68.4375,  70.3125,  72.1875,  74.0625,  75.9375,
              77.8125,  79.6875,  81.5625,  83.4375,  85.3125,  87.1875,  89.0625,
              90.9375,  92.8125,  94.6875,  96.5625,  98.4375, 100.3125, 102.1875,
             104.0625, 105.9375, 107.8125, 109.6875, 111.5625, 113.4375, 115.3125,
             117.1875, 119.0625, 120.9375, 122.8125, 124.6875, 126.5625, 128.4375,
             130.3125, 132.1875, 134.0625, 135.9375, 137.8125, 139.6875, 141.5625,
             143.4375, 145.3125, 147.1875, 149.0625, 150.9375, 152.8125, 154.6875,
             156.5625, 158.4375, 160.3125, 162.1875, 164.0625, 165.9375, 167.8125,
             169.6875, 171.5625, 173.4375, 175.3125, 177.1875, 179.0625, 180.9375,
             182.8125, 184.6875, 186.5625, 188.4375, 190.3125, 192.1875, 194.0625,
             195.9375, 197.8125, 199.6875, 201.5625, 203.4375, 205.3125, 207.1875,
             209.0625, 210.9375, 212.8125, 214.6875, 216.5625, 218.4375, 220.3125,
             222.1875, 224.0625, 225.9375, 227.8125, 229.6875, 231.5625, 233.4375,
             235.3125, 237.1875, 239.0625, 240.9375, 242.8125, 244.6875, 246.5625,
             248.4375, 250.3125, 252.1875, 254.0625, 255.9375, 257.8125, 259.6875,
             261.5625, 263.4375, 265.3125, 267.1875, 269.0625, 270.9375, 272.8125,
             274.6875, 276.5625, 278.4375, 280.3125, 282.1875, 284.0625, 285.9375,
             287.8125, 289.6875, 291.5625, 293.4375, 295.3125, 297.1875, 299.0625,
             300.9375, 302.8125, 304.6875, 306.5625, 308.4375, 310.3125, 312.1875,
             314.0625, 315.9375, 317.8125, 319.6875, 321.5625, 323.4375, 325.3125,
             327.1875, 329.0625, 330.9375, 332.8125, 334.6875, 336.5625, 338.4375,
             340.3125, 342.1875, 344.0625, 345.9375, 347.8125, 349.6875, 351.5625,
             353.4375, 355.3125, 357.1875])
    • latitude
      (latitude)
      float64
      -88.63 -86.13 ... 86.13 88.63
      long_name :
      latitude
      units :
      degrees_north
      cell_methods :
      time: point
      axis :
      Y
      standard_name :
      latitude
      bounds :
      lat_bnds
      array([-88.628983, -86.127939, -83.875374, -81.632443, -79.392556, -77.154046,
             -74.916278, -72.678959, -70.441932, -68.205105, -65.968422, -63.731846,
             -61.495352, -59.25892 , -57.022539, -54.786199, -52.549892, -50.313612,
             -48.077355, -45.841118, -43.604897, -41.36869 , -39.132495, -36.896311,
             -34.660136, -32.423969, -30.187808, -27.951654, -25.715505, -23.479361,
             -21.24322 , -19.007083, -16.770949, -14.534818, -12.298689, -10.062561,
              -7.826435,  -5.59031 ,  -3.354186,  -1.118062,   1.118062,   3.354186,
               5.59031 ,   7.826435,  10.062561,  12.298689,  14.534818,  16.770949,
              19.007083,  21.24322 ,  23.479361,  25.715505,  27.951654,  30.187808,
              32.423969,  34.660136,  36.896311,  39.132495,  41.36869 ,  43.604897,
              45.841118,  48.077355,  50.313612,  52.549892,  54.786199,  57.022539,
              59.25892 ,  61.495352,  63.731846,  65.968422,  68.205105,  70.441932,
              72.678959,  74.916278,  77.154046,  79.392556,  81.632443,  83.875374,
              86.127939,  88.628983])
    • time
      (time)
      object
      1850-01-17 00:00:00 ... 2014-12-...
      axis :
      T
      long_name :
      time
      calendar_type :
      noleap
      bounds :
      time_bnds
      description :
      for time-mean fields
      standard_name :
      time
      array([cftime.DatetimeNoLeap(1850, 1, 17, 0, 0, 0, 0, has_year_zero=True),
             cftime.DatetimeNoLeap(1850, 2, 15, 12, 0, 0, 0, has_year_zero=True),
             cftime.DatetimeNoLeap(1850, 3, 17, 0, 0, 0, 0, has_year_zero=True), ...,
             cftime.DatetimeNoLeap(2014, 10, 17, 0, 0, 0, 0, has_year_zero=True),
             cftime.DatetimeNoLeap(2014, 11, 16, 12, 0, 0, 0, has_year_zero=True),
             cftime.DatetimeNoLeap(2014, 12, 17, 0, 0, 0, 0, has_year_zero=True)],
            dtype=object)
    • lon_bnds
      (longitude, bnds)
      float64
      ...
      long_name :
      longitude bounds
      units :
      degrees_east
      axis :
      X
      array([[ -1.875,   0.   ],
             [  0.   ,   1.875],
             [  1.875,   3.75 ],
             ...,
             [352.5  , 354.375],
             [354.375, 356.25 ],
             [356.25 , 358.125]])
    • lat_bnds
      (latitude, bnds)
      float64
      ...
      long_name :
      latitude bounds
      units :
      degrees_north
      axis :
      Y
      array([[-89.999996, -87.257969],
             [-87.257969, -84.997909],
             [-84.997909, -82.75284 ],
             [-82.75284 , -80.512046],
             [-80.512046, -78.273066],
             [-78.273066, -76.035025],
             [-76.035025, -73.797531],
             [-73.797531, -71.560387],
             [-71.560387, -69.323477],
             [-69.323477, -67.086733],
             [-67.086733, -64.850111],
             [-64.850111, -62.613581],
             [-62.613581, -60.377122],
             [-60.377122, -58.140719],
             [-58.140719, -55.90436 ],
             [-55.90436 , -53.668038],
             [-53.668038, -51.431745],
             [-51.431745, -49.195478],
             [-49.195478, -46.959232],
             [-46.959232, -44.723003],
             [-44.723003, -42.48679 ],
             [-42.48679 , -40.25059 ],
             [-40.25059 , -38.014401],
             [-38.014401, -35.778221],
             [-35.778221, -33.54205 ],
             [-33.54205 , -31.305887],
             [-31.305887, -29.06973 ],
             [-29.06973 , -26.833578],
             [-26.833578, -24.597432],
             [-24.597432, -22.36129 ],
             [-22.36129 , -20.125151],
             [-20.125151, -17.889016],
             [-17.889016, -15.652883],
             [-15.652883, -13.416753],
             [-13.416753, -11.180624],
             [-11.180624,  -8.944498],
             [ -8.944498,  -6.708372],
             [ -6.708372,  -4.472248],
             [ -4.472248,  -2.236124],
             [ -2.236124,   0.      ],
             [  0.      ,   2.236124],
             [  2.236124,   4.472248],
             [  4.472248,   6.708372],
             [  6.708372,   8.944498],
             [  8.944498,  11.180624],
             [ 11.180624,  13.416753],
             [ 13.416753,  15.652883],
             [ 15.652883,  17.889016],
             [ 17.889016,  20.125151],
             [ 20.125151,  22.36129 ],
             [ 22.36129 ,  24.597432],
             [ 24.597432,  26.833578],
             [ 26.833578,  29.06973 ],
             [ 29.06973 ,  31.305887],
             [ 31.305887,  33.54205 ],
             [ 33.54205 ,  35.778221],
             [ 35.778221,  38.014401],
             [ 38.014401,  40.25059 ],
             [ 40.25059 ,  42.48679 ],
             [ 42.48679 ,  44.723003],
             [ 44.723003,  46.959232],
             [ 46.959232,  49.195478],
             [ 49.195478,  51.431745],
             [ 51.431745,  53.668038],
             [ 53.668038,  55.90436 ],
             [ 55.90436 ,  58.140719],
             [ 58.140719,  60.377122],
             [ 60.377122,  62.613581],
             [ 62.613581,  64.850111],
             [ 64.850111,  67.086733],
             [ 67.086733,  69.323477],
             [ 69.323477,  71.560387],
             [ 71.560387,  73.797531],
             [ 73.797531,  76.035025],
             [ 76.035025,  78.273066],
             [ 78.273066,  80.512046],
             [ 80.512046,  82.75284 ],
             [ 82.75284 ,  84.997909],
             [ 84.997909,  87.257969],
             [ 87.257969,  89.999996]])
    • tos
      (time, latitude, longitude)
      float32
      ...
      units :
      degC
      long_name :
      Sea Surface Temperature
      cell_methods :
      area: mean where sea time: mean
      cell_measures :
      area: areacello
      table_id :
      Omon
      standard_name :
      sea_surface_temperature
      [30412800 values with dtype=float32]
    • time_bnds
      (time, bnds)
      object
      ...
      axis :
      T
      long_name :
      time axis boundaries
      FillValue :
      1e+20
      array([[cftime.DatetimeNoLeap(1850, 1, 1, 12, 0, 0, 0, has_year_zero=True),
              cftime.DatetimeNoLeap(1850, 2, 1, 12, 0, 0, 0, has_year_zero=True)],
             [cftime.DatetimeNoLeap(1850, 2, 1, 12, 0, 0, 0, has_year_zero=True),
              cftime.DatetimeNoLeap(1850, 3, 1, 12, 0, 0, 0, has_year_zero=True)],
             [cftime.DatetimeNoLeap(1850, 3, 1, 12, 0, 0, 0, has_year_zero=True),
              cftime.DatetimeNoLeap(1850, 4, 1, 12, 0, 0, 0, has_year_zero=True)],
             ...,
             [cftime.DatetimeNoLeap(2014, 10, 1, 12, 0, 0, 0, has_year_zero=True),
              cftime.DatetimeNoLeap(2014, 11, 1, 12, 0, 0, 0, has_year_zero=True)],
             [cftime.DatetimeNoLeap(2014, 11, 1, 12, 0, 0, 0, has_year_zero=True),
              cftime.DatetimeNoLeap(2014, 12, 1, 12, 0, 0, 0, has_year_zero=True)],
             [cftime.DatetimeNoLeap(2014, 12, 1, 12, 0, 0, 0, has_year_zero=True),
              cftime.DatetimeNoLeap(2015, 1, 1, 12, 0, 0, 0, has_year_zero=True)]],
            dtype=object)
  • source :
    Manabe Climate Model at U of Arizona
    activity_id :
    CMIP
    branch_method :
    standard
    Conventions :
    CF-1.7 CMIP-6.2
    creation_date :
    2019-08-14T00:00:00Z
    data_specs_version :
    01.00.28
    experiment :
    all-forcing simulation of the recent past
    experiment_id :
    historical
    frequency :
    mon
    further_info_url :
    https://furtherinfo.es-doc.org/CMIP6.UA.MCM-UA-1-0.historical.none.r1i1p1f2
    grid :
    lat-lon
    grid_label :
    gn
    initialization_index :
    0
    institution_id :
    UA
    mip_era :
    CMIP6
    nominal_resolution :
    250 km
    parent_activity_id :
    CMIP
    parent_experiment_id :
    piControl
    parent_mip_era :
    CMIP6
    parent_source_id :
    MCM-UA-1-0
    parent_time_units :
    days since 0001-01-01 (noleap)
    physics_index :
    0
    realization_index :
    0
    realm :
    ocean
    references :
    Delworth, Thomas L., Ronald J Stouffer, Keith W Dixon, Michael JSpelman, Thomas R Knutson, Anthony J Broccoli, P J Kushner, andRichard T Wetherald, 2002: Review of simulations of climatevariability and change with the GFDL R30 coupled climate model.Climate Dynamics, 19(7), 555-574....and references therein.
    source_id :
    MCM-UA-1-0
    source_type :
    AOGCM
    tracking_id :
    hdl:21.14100/fe10c9a4-7858-4768-82c0-8b034955e3e3
    external_variables :
    areacello
    branch_time_in_child :
    1533000.0
    branch_time_in_parent :
    0.0
    table_id :
    Omon
    history :
    File was produced by a FORTRAN program
    contact :
    GEOS-CMIP@email.arizona.edu
    comment :
    Used level closest to surface
    forcing_index :
    2
    variant_label :
    r1i1p1f2
    variant_info :
    1850 to 2014 time varying equiv CO2, solar, volcano and aerosolforcing
    variable_id :
    tos
    sub_experiment :
    none
    sub_experiment_id :
    none
    title :
    UArizona MCM-UA-1-0 historical
    product :
    model-output
    institution :
    Department of Geosciences, University of Arizona, Tucson, AZ 85721, USA
    license :
    CMIP6 model data produced by the U of Arizona is licensed under a Creative Commons Attribution-[*]ShareAlike 4.0 International License (https://creativecommons.org/licenses/). Consult https://pcmdi.llnl.gov/CMIP6/TermsOfUse for terms of use governing CMIP6 output, including citation requirements and proper acknowledgment. Further information about this data, including some limitations, can be found via the further_info_url (recorded as a global attribute in this file). The data producers and data providers make no warranty, either express or implied, including, but not limited to, warranties of merchantability and fitness for a particular purpose. All liabilities arising from the supply of the information (including any liability arising in negligence) are excluded to the fullest extent permitted by law.
In [95]:
# conversione in datetime64
datetimeindex = d1.indexes['time'].to_datetimeindex()
datetimeindex = d2.indexes['time'].to_datetimeindex()
datetimeindex = d3.indexes['time'].to_datetimeindex()
datetimeindex = d4.indexes['time'].to_datetimeindex()
d1['time'] = datetimeindex
d2['time'] = datetimeindex
d3['time'] = datetimeindex
d4['time'] = datetimeindex
C:\Users\fadda\anaconda3\envs\time_series\lib\site-packages\xarray\coding\times.py:360: FutureWarning: Index.ravel returning ndarray is deprecated; in a future version this will return a view on self.
  sample = dates.ravel()[0]
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\4293979708.py:2: RuntimeWarning: Converting a CFTimeIndex with dates from a non-standard calendar, 'noleap', to a pandas.DatetimeIndex, which uses dates from the standard calendar.  This may lead to subtle errors in operations that depend on the length of time between dates.
  datetimeindex = d1.indexes['time'].to_datetimeindex()
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\4293979708.py:3: RuntimeWarning: Converting a CFTimeIndex with dates from a non-standard calendar, 'noleap', to a pandas.DatetimeIndex, which uses dates from the standard calendar.  This may lead to subtle errors in operations that depend on the length of time between dates.
  datetimeindex = d2.indexes['time'].to_datetimeindex()
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\4293979708.py:4: RuntimeWarning: Converting a CFTimeIndex with dates from a non-standard calendar, 'noleap', to a pandas.DatetimeIndex, which uses dates from the standard calendar.  This may lead to subtle errors in operations that depend on the length of time between dates.
  datetimeindex = d3.indexes['time'].to_datetimeindex()
C:\Users\fadda\AppData\Local\Temp\ipykernel_15492\4293979708.py:5: RuntimeWarning: Converting a CFTimeIndex with dates from a non-standard calendar, 'noleap', to a pandas.DatetimeIndex, which uses dates from the standard calendar.  This may lead to subtle errors in operations that depend on the length of time between dates.
  datetimeindex = d4.indexes['time'].to_datetimeindex()

Costruzione del modello relativo alla parte Nord Atlantica

In [96]:
modelloAmo = d1.where((d1.latitude > 0) & (d1.latitude < 60) & (d1.longitude > 285) & (d1.longitude < 352.5), drop=True)
In [97]:
p0 = modelloAmo.tos.mean('time').plot(transform=ccrs.PlateCarree(),subplot_kws={'projection': ccrs.Robinson()},
                            cmap='jet',extend='both')
p0.axes.coastlines()
p0.axes.gridlines()
Out[97]:
<cartopy.mpl.gridliner.Gridliner at 0x2124d8bbd30>

Costruzione del modello escludendo l'Atlantico del Nord

In [98]:
modello2 = d2.where((d2.latitude < 90) & (d2.longitude < 352.5) & (d2.longitude <= 285)  , drop=False)
In [99]:
p0 = modello2.tos.mean('time').plot(transform=ccrs.PlateCarree(),subplot_kws={'projection': ccrs.Robinson()},
                            cmap='jet',extend='both')
p0.axes.coastlines()
p0.axes.gridlines()
Out[99]:
<cartopy.mpl.gridliner.Gridliner at 0x212513d9e50>
In [100]:
modello3 = d3.where((d1.latitude < 0) & (d3.longitude > 290) & (d3.longitude < 360), drop=False)
In [101]:
p0 = modello3.tos.mean('time').plot(transform=ccrs.PlateCarree(),subplot_kws={'projection': ccrs.Robinson()},
                            cmap='jet',extend='both')
p0.axes.coastlines()
p0.axes.gridlines()
Out[101]:
<cartopy.mpl.gridliner.Gridliner at 0x2124f851130>
In [102]:
modello4 = d4.where((d1.latitude >= 60), drop=False)
In [103]:
p0 = modello4.tos.mean('time').plot(transform=ccrs.PlateCarree(),subplot_kws={'projection': ccrs.Robinson()},
                            cmap='jet',extend='both')
p0.axes.coastlines()
p0.axes.gridlines()
Out[103]:
<cartopy.mpl.gridliner.Gridliner at 0x2124d917ee0>

Si effettua un merge del modello2, modello3 e modello4, in modo da ricostruire il modello di interesse per noi.

In [104]:
modelloNoAtl = xr.merge([modello2,modello3,modello4])
In [105]:
p0 = modelloNoAtl.tos.mean('time').plot(transform=ccrs.PlateCarree(),subplot_kws={'projection': ccrs.Robinson()},
                            cmap='jet',extend='both')
p0.axes.coastlines()
p0.axes.gridlines()
Out[105]:
<cartopy.mpl.gridliner.Gridliner at 0x2124fdfe550>

Dopo aver costruito i due modelli andiamo a calcolare le anomalie di temperatura:

In [106]:
# Calcoliamo le normals prendendo il periodo di riferimento, raggruppando per mese e andando a effettuare la media

normalNoAtl = modelloNoAtl.tos.sel(time=slice('1850-01', '1890-12-31')).groupby('time.month').mean()
normalAmo = modelloAmo.tos.sel(time=slice('1850-01', '1890-12-31')).groupby('time.month').mean()
In [107]:
# Le anomalie vengono calcolate raggruppando per mese le temperature assolute e poi sottraendo le normals

anomalieNoAtl = modelloNoAtl.tos.groupby('time.month') - normalNoAtl
anomalieAmo = modelloAmo.tos.groupby('time.month') - normalAmo
In [108]:
# modifica dei pesi
# Visto che le aree delle celle diminuiscono con la latitudine si vanno a creare dei pesi proporzionali alle celle
# in modo da poter creare una media pesata

pesiNoAtl = np.cos(np.deg2rad(modelloNoAtl.latitude))
pesiAmo = np.cos(np.deg2rad(modelloAmo.latitude))
In [109]:
# Il modello completo conterrà le anomalie di temperature dell'Atlantico del Nord e del globo intero escludendo il Nord Atlantico.
# Da una risoluzione mensile passiamo ad una risoluzione annuale, aggregando poi anche nello spazio.

modelloCompleto = xr.Dataset()
modelloCompleto['tNoAtl'] = anomalieNoAtl.weighted(pesiNoAtl).mean(('longitude', 'latitude'), keep_attrs=True).resample(time='Y').mean()
modelloCompleto['tAmo'] = anomalieAmo.weighted(pesiAmo).mean(('longitude', 'latitude'), keep_attrs=True).resample(time='Y').mean()

Seguendo la definizione dell'indice AMO, si va quindi a sottrarre alle anomalie dell'Atlantico quelle relative a tutto il globo escludendo il Nord Atlantico

In [110]:
totale = modelloCompleto.tAmo - modelloCompleto.tNoAtl
totale
Out[110]:
<xarray.DataArray (time: 165)>
array([-0.07263093, -0.22366853, -0.28658982, -0.0055107 , -0.22068568,
       -0.15473103, -0.26443282, -0.14798923, -0.21817076, -0.221023  ,
       -0.20454415, -0.11603901,  0.06204571,  0.15235264,  0.01780326,
       -0.04170245, -0.18067771, -0.08880673,  0.13044599, -0.21868026,
        0.06800493,  0.19949009,  0.16899327,  0.56415666,  0.21012964,
        0.04514276, -0.09230632,  0.11883762,  0.14837877,  0.16980886,
       -0.11246507,  0.18595026,  0.18903891,  0.14902244,  0.02561967,
       -0.11485682, -0.05632997,  0.18056069, -0.15458829,  0.17137712,
        0.23927074,  0.04803847,  0.03627353,  0.2803443 ,  0.11586739,
       -0.01859957, -0.08989961, -0.32751925,  0.00724594,  0.00996242,
        0.17878007,  0.10384548,  0.10215164,  0.0776101 ,  0.01139102,
        0.01677712,  0.06992126,  0.15792389,  0.11136409,  0.11155269,
        0.040364  , -0.1133489 , -0.3970292 , -0.23458152, -0.22976045,
        0.09084673, -0.10571517, -0.00772636,  0.0859891 , -0.00574517,
        0.04027676,  0.08953073, -0.11436532, -0.20162388, -0.24563517,
        0.04430177,  0.14587949, -0.08706278,  0.0955557 ,  0.14624774,
       -0.11901757,  0.06409697, -0.01800325,  0.04904631,  0.01416419,
        0.18627258,  0.0519064 , -0.05746475, -0.04584225,  0.14548027,
        0.02536201,  0.04335942,  0.16644177, -0.19813158, -0.07018882,
       -0.21742252, -0.07218918, -0.16799216, -0.32103125, -0.12199013,
        0.02659784, -0.22993997,  0.13123566, -0.14043248, -0.05923033,
        0.07452597,  0.01401333,  0.10343263,  0.00709666, -0.1789199 ,
       -0.05340617, -0.34674803, -0.3652285 , -0.23773795, -0.11978361,
       -0.44787504, -0.32234429, -0.3513176 , -0.12119485,  0.04117268,
       -0.0026219 , -0.21733221, -0.23179569, -0.06486765, -0.04235908,
        0.15465302,  0.06245346, -0.10211512, -0.18674023, -0.20051849,
       -0.33691783, -0.19193922, -0.26459378, -0.3018955 , -0.30659096,
       -0.20916768,  0.01497165, -0.00269398, -0.0480597 ,  0.16328518,
        0.21509416,  0.01457634,  0.02589139, -0.03650928,  0.06242263,
        0.26190217,  0.12469045,  0.16163021,  0.13165462,  0.12400276,
        0.15535545, -0.08205654, -0.1053201 , -0.19202144, -0.01987248,
        0.20346015,  0.07531302,  0.08818656,  0.13871626, -0.10674908,
        0.18720088, -0.08335551, -0.31027143, -0.34111918, -0.34566067])
Coordinates:
  * time     (time) datetime64[ns] 1850-12-31 1851-12-31 ... 2014-12-31
xarray.DataArray
  • time: 165
  • -0.07263 -0.2237 -0.2866 -0.005511 ... -0.3103 -0.3411 -0.3457
    array([-0.07263093, -0.22366853, -0.28658982, -0.0055107 , -0.22068568,
           -0.15473103, -0.26443282, -0.14798923, -0.21817076, -0.221023  ,
           -0.20454415, -0.11603901,  0.06204571,  0.15235264,  0.01780326,
           -0.04170245, -0.18067771, -0.08880673,  0.13044599, -0.21868026,
            0.06800493,  0.19949009,  0.16899327,  0.56415666,  0.21012964,
            0.04514276, -0.09230632,  0.11883762,  0.14837877,  0.16980886,
           -0.11246507,  0.18595026,  0.18903891,  0.14902244,  0.02561967,
           -0.11485682, -0.05632997,  0.18056069, -0.15458829,  0.17137712,
            0.23927074,  0.04803847,  0.03627353,  0.2803443 ,  0.11586739,
           -0.01859957, -0.08989961, -0.32751925,  0.00724594,  0.00996242,
            0.17878007,  0.10384548,  0.10215164,  0.0776101 ,  0.01139102,
            0.01677712,  0.06992126,  0.15792389,  0.11136409,  0.11155269,
            0.040364  , -0.1133489 , -0.3970292 , -0.23458152, -0.22976045,
            0.09084673, -0.10571517, -0.00772636,  0.0859891 , -0.00574517,
            0.04027676,  0.08953073, -0.11436532, -0.20162388, -0.24563517,
            0.04430177,  0.14587949, -0.08706278,  0.0955557 ,  0.14624774,
           -0.11901757,  0.06409697, -0.01800325,  0.04904631,  0.01416419,
            0.18627258,  0.0519064 , -0.05746475, -0.04584225,  0.14548027,
            0.02536201,  0.04335942,  0.16644177, -0.19813158, -0.07018882,
           -0.21742252, -0.07218918, -0.16799216, -0.32103125, -0.12199013,
            0.02659784, -0.22993997,  0.13123566, -0.14043248, -0.05923033,
            0.07452597,  0.01401333,  0.10343263,  0.00709666, -0.1789199 ,
           -0.05340617, -0.34674803, -0.3652285 , -0.23773795, -0.11978361,
           -0.44787504, -0.32234429, -0.3513176 , -0.12119485,  0.04117268,
           -0.0026219 , -0.21733221, -0.23179569, -0.06486765, -0.04235908,
            0.15465302,  0.06245346, -0.10211512, -0.18674023, -0.20051849,
           -0.33691783, -0.19193922, -0.26459378, -0.3018955 , -0.30659096,
           -0.20916768,  0.01497165, -0.00269398, -0.0480597 ,  0.16328518,
            0.21509416,  0.01457634,  0.02589139, -0.03650928,  0.06242263,
            0.26190217,  0.12469045,  0.16163021,  0.13165462,  0.12400276,
            0.15535545, -0.08205654, -0.1053201 , -0.19202144, -0.01987248,
            0.20346015,  0.07531302,  0.08818656,  0.13871626, -0.10674908,
            0.18720088, -0.08335551, -0.31027143, -0.34111918, -0.34566067])
    • time
      (time)
      datetime64[ns]
      1850-12-31 ... 2014-12-31
      array(['1850-12-31T00:00:00.000000000', '1851-12-31T00:00:00.000000000',
             '1852-12-31T00:00:00.000000000', '1853-12-31T00:00:00.000000000',
             '1854-12-31T00:00:00.000000000', '1855-12-31T00:00:00.000000000',
             '1856-12-31T00:00:00.000000000', '1857-12-31T00:00:00.000000000',
             '1858-12-31T00:00:00.000000000', '1859-12-31T00:00:00.000000000',
             '1860-12-31T00:00:00.000000000', '1861-12-31T00:00:00.000000000',
             '1862-12-31T00:00:00.000000000', '1863-12-31T00:00:00.000000000',
             '1864-12-31T00:00:00.000000000', '1865-12-31T00:00:00.000000000',
             '1866-12-31T00:00:00.000000000', '1867-12-31T00:00:00.000000000',
             '1868-12-31T00:00:00.000000000', '1869-12-31T00:00:00.000000000',
             '1870-12-31T00:00:00.000000000', '1871-12-31T00:00:00.000000000',
             '1872-12-31T00:00:00.000000000', '1873-12-31T00:00:00.000000000',
             '1874-12-31T00:00:00.000000000', '1875-12-31T00:00:00.000000000',
             '1876-12-31T00:00:00.000000000', '1877-12-31T00:00:00.000000000',
             '1878-12-31T00:00:00.000000000', '1879-12-31T00:00:00.000000000',
             '1880-12-31T00:00:00.000000000', '1881-12-31T00:00:00.000000000',
             '1882-12-31T00:00:00.000000000', '1883-12-31T00:00:00.000000000',
             '1884-12-31T00:00:00.000000000', '1885-12-31T00:00:00.000000000',
             '1886-12-31T00:00:00.000000000', '1887-12-31T00:00:00.000000000',
             '1888-12-31T00:00:00.000000000', '1889-12-31T00:00:00.000000000',
             '1890-12-31T00:00:00.000000000', '1891-12-31T00:00:00.000000000',
             '1892-12-31T00:00:00.000000000', '1893-12-31T00:00:00.000000000',
             '1894-12-31T00:00:00.000000000', '1895-12-31T00:00:00.000000000',
             '1896-12-31T00:00:00.000000000', '1897-12-31T00:00:00.000000000',
             '1898-12-31T00:00:00.000000000', '1899-12-31T00:00:00.000000000',
             '1900-12-31T00:00:00.000000000', '1901-12-31T00:00:00.000000000',
             '1902-12-31T00:00:00.000000000', '1903-12-31T00:00:00.000000000',
             '1904-12-31T00:00:00.000000000', '1905-12-31T00:00:00.000000000',
             '1906-12-31T00:00:00.000000000', '1907-12-31T00:00:00.000000000',
             '1908-12-31T00:00:00.000000000', '1909-12-31T00:00:00.000000000',
             '1910-12-31T00:00:00.000000000', '1911-12-31T00:00:00.000000000',
             '1912-12-31T00:00:00.000000000', '1913-12-31T00:00:00.000000000',
             '1914-12-31T00:00:00.000000000', '1915-12-31T00:00:00.000000000',
             '1916-12-31T00:00:00.000000000', '1917-12-31T00:00:00.000000000',
             '1918-12-31T00:00:00.000000000', '1919-12-31T00:00:00.000000000',
             '1920-12-31T00:00:00.000000000', '1921-12-31T00:00:00.000000000',
             '1922-12-31T00:00:00.000000000', '1923-12-31T00:00:00.000000000',
             '1924-12-31T00:00:00.000000000', '1925-12-31T00:00:00.000000000',
             '1926-12-31T00:00:00.000000000', '1927-12-31T00:00:00.000000000',
             '1928-12-31T00:00:00.000000000', '1929-12-31T00:00:00.000000000',
             '1930-12-31T00:00:00.000000000', '1931-12-31T00:00:00.000000000',
             '1932-12-31T00:00:00.000000000', '1933-12-31T00:00:00.000000000',
             '1934-12-31T00:00:00.000000000', '1935-12-31T00:00:00.000000000',
             '1936-12-31T00:00:00.000000000', '1937-12-31T00:00:00.000000000',
             '1938-12-31T00:00:00.000000000', '1939-12-31T00:00:00.000000000',
             '1940-12-31T00:00:00.000000000', '1941-12-31T00:00:00.000000000',
             '1942-12-31T00:00:00.000000000', '1943-12-31T00:00:00.000000000',
             '1944-12-31T00:00:00.000000000', '1945-12-31T00:00:00.000000000',
             '1946-12-31T00:00:00.000000000', '1947-12-31T00:00:00.000000000',
             '1948-12-31T00:00:00.000000000', '1949-12-31T00:00:00.000000000',
             '1950-12-31T00:00:00.000000000', '1951-12-31T00:00:00.000000000',
             '1952-12-31T00:00:00.000000000', '1953-12-31T00:00:00.000000000',
             '1954-12-31T00:00:00.000000000', '1955-12-31T00:00:00.000000000',
             '1956-12-31T00:00:00.000000000', '1957-12-31T00:00:00.000000000',
             '1958-12-31T00:00:00.000000000', '1959-12-31T00:00:00.000000000',
             '1960-12-31T00:00:00.000000000', '1961-12-31T00:00:00.000000000',
             '1962-12-31T00:00:00.000000000', '1963-12-31T00:00:00.000000000',
             '1964-12-31T00:00:00.000000000', '1965-12-31T00:00:00.000000000',
             '1966-12-31T00:00:00.000000000', '1967-12-31T00:00:00.000000000',
             '1968-12-31T00:00:00.000000000', '1969-12-31T00:00:00.000000000',
             '1970-12-31T00:00:00.000000000', '1971-12-31T00:00:00.000000000',
             '1972-12-31T00:00:00.000000000', '1973-12-31T00:00:00.000000000',
             '1974-12-31T00:00:00.000000000', '1975-12-31T00:00:00.000000000',
             '1976-12-31T00:00:00.000000000', '1977-12-31T00:00:00.000000000',
             '1978-12-31T00:00:00.000000000', '1979-12-31T00:00:00.000000000',
             '1980-12-31T00:00:00.000000000', '1981-12-31T00:00:00.000000000',
             '1982-12-31T00:00:00.000000000', '1983-12-31T00:00:00.000000000',
             '1984-12-31T00:00:00.000000000', '1985-12-31T00:00:00.000000000',
             '1986-12-31T00:00:00.000000000', '1987-12-31T00:00:00.000000000',
             '1988-12-31T00:00:00.000000000', '1989-12-31T00:00:00.000000000',
             '1990-12-31T00:00:00.000000000', '1991-12-31T00:00:00.000000000',
             '1992-12-31T00:00:00.000000000', '1993-12-31T00:00:00.000000000',
             '1994-12-31T00:00:00.000000000', '1995-12-31T00:00:00.000000000',
             '1996-12-31T00:00:00.000000000', '1997-12-31T00:00:00.000000000',
             '1998-12-31T00:00:00.000000000', '1999-12-31T00:00:00.000000000',
             '2000-12-31T00:00:00.000000000', '2001-12-31T00:00:00.000000000',
             '2002-12-31T00:00:00.000000000', '2003-12-31T00:00:00.000000000',
             '2004-12-31T00:00:00.000000000', '2005-12-31T00:00:00.000000000',
             '2006-12-31T00:00:00.000000000', '2007-12-31T00:00:00.000000000',
             '2008-12-31T00:00:00.000000000', '2009-12-31T00:00:00.000000000',
             '2010-12-31T00:00:00.000000000', '2011-12-31T00:00:00.000000000',
             '2012-12-31T00:00:00.000000000', '2013-12-31T00:00:00.000000000',
             '2014-12-31T00:00:00.000000000'], dtype='datetime64[ns]')

Costruiamo un dataframe che conterrà anno per anno il valore dell'AMO Index.

In [111]:
df2 = pd.DataFrame(pd.date_range('1850', '2015', freq='Y'), columns = ['anno'])
df2['anno'] = pd.DatetimeIndex(df2['anno']).year
df2['anomAmo'] = totale
df2 = df2.set_index('anno')
In [112]:
plt.figure(figsize=(13,8))
x = df2.anomAmo
y = np.linspace(1860, 2020, 165)
plt.axhline(y=0, color='black', linestyle='-')

plt.title('Oscillazione multidecennale atlantica')
plt.ylabel('AMO Index')

plt.fill_between(y, x, where=x>0, color='orange')
plt.fill_between(y, x, where=x<0, color='blue')

plt.show()

Nel grafico sopra si può osservare l'andamento di questo indice, e quindi le sue oscillazioni sia negative che positive dal 1860 al 2014.

In basso vediamo come dovrebbe risultare l'AMO Index.

In [113]:
df2.reset_index(inplace = True)
In [114]:
finale = df1.merge(df2, how='left')
In [115]:
finale = finale.set_index('anno')
In [116]:
plt.figure(figsize=(12,5))

ax3 = finale.anomAmo.rolling(window =15).mean().plot(color='green', grid=True, label='AMO Index')

ax4 = finale.anomGroe.rolling(window =15).mean().plot(color='red', grid=True, secondary_y=True, label='Anomalie Groenlandia')

plt.title('Confronto tra le anomalie di temperatura in Groenlandia e AMO Index')
plt.xlabel('Anno')
ax3.set_ylabel('AMO Index', color='green')
ax4.set_ylabel('Anomalie Groenlandia [°C]', color='red')

h1, l1 = ax3.get_legend_handles_labels()
h2, l2 = ax4.get_legend_handles_labels()

plt.show()

Nel grafico sopra sono state plottate due serie storiche: in verde l'AMO Index e in rosso le anomalie di temperatura nella Groenlandia, per una migliore visualizzazione si è usata una media mobile per entrambe le serie.

Osservando il grafico si può osservare come non ci sia una netta tendenza tra le due, ma è vero che nel periodo tra il 1925 - 1940 l'incremento dell'AMO Index si riflette anche in un aumento delle anomalie di temperatura della Groenlandia, e allo stesso modo si nota un decremento sia delle anomalie che dell'AMO Index nei decenni successivi. Si nota infine, un andamento divergente negli anni 2000 tra l'AMO Index e le anomalie di temperatura, questo potrebbe essere influenzato dall'aumento delle anomalie di temperature globali che si sono registrate a partire del 21° secolo.

Per concludere si può quindi affermare che da una semplice visualizzazione grafica l'AMO Index ha una certa influenza sulle temperature della Groenlandia, nell'ultimo decennio però l'andamento divergente che si nota dal grafico potrebbe essere indicativo di un cambio di passo rispetto all'influenza che l'AMO esercita.

Sviluppi futuri¶

  • Aggiornare con dati più recenti, per vedere l'andamento negli ultimi anni;
  • Sviluppare un modello per la correzione dei dati mancanti;
  • Prendere in considerazione l'andamento dello scioglimento dei ghiacci in Groenlandia con le fasi dell'AMO Index.